A Cisco IoT solution using edge intelligence goes to the open sea
Much like a virus, IoT sensors have been multiplying and spreading into the deepest corners of our industrial, domestic, and public environments. At the same time, we want to get insights into the data they collect in a centralized, consolidated and intuitive way. How do we accommodate those two extremes? And how can Cisco play a role in that effort? We left the mainland to find out, and ended up with a scalable, flexible solution fit for today’s IoT deployments.
For our specific use case, we worked with a customer specialized in the production of renewable energy. They are in the process of moving solar power production out to sea, using environmentally friendly floating solar panels. This takes advantage of an area that until now has gone unused, and relieves the environmental impact solar power installations can have onshore. Given the business criticality of energy production, it was crucial for them to monitor the performance, environment, and productivity of these floating solar panels in real time, thus setting the stage for a Cisco IoT solution using Edge Intelligence.
Once afloat, the sensor data is to be sent to shore, and uploaded to an Azure IoT deployment functioning as a dashboard for sensor data monitoring. To accommodate the remote character of the sensors, the customer originally buffered data at the solar panel fleet itself, sending a big “blob” of data up to the cloud every 20 minutes.
However, the criticality of the installation demands timely detection of abnormal behavior and/or conditions, and therefore the possibility of real-time monitoring. That's where Edge Intelligence came in, performing timely data collection and transformation on-site, as well as real-time data streaming to the cloud. Smooth sailing ahead!
Use case
Leveraging the Cisco IoT portfolio, we enabled the following workflow:
- Collect environmental sensor data (temperature, humidity, pressure and motion) directly from BLE sensors.
- Normalize incoming data to a customized format, including string-to-numeric transformation, timestamp validation and local buffering (a sample raw reading is sketched right after this list).
- Publish data to Azure IoT, as well as a local MQTT broker/time series database.
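To make the normalization step concrete, here is a sketch of what a single raw reading could look like when it reaches the Edge Intelligence agent over MQTT. The field names (type, id, value, time) are taken from the parsing in the data logic shown later in this post; the values themselves are illustrative assumptions, and note in particular that the temperature arrives as a string:

{
  "type": "measurement",
  "id": "temperature",
  "value": "21.4",
  "time": "2021-06-01T09:15:42Z"
}

The data logic republishes this as a numeric (double) temperature field, together with the sensor's MAC address and a human-readable label, per the output model defined below.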
Lab setup
From left to right, our setup consists of:
- Seven MT MultiTracker 110 sensors each generating different types of data (temperature, pressure, humidity, motion) over BLE
- Two Cisco Catalyst 9120 Access Points collecting sensor data over BLE and subsequently publishing it over MQTT.
- One Cisco IC3000 Industrial Compute Gateway running an Edge Intelligence agent that is subscribed to the MQTT topics published to by the access points, transforms the incoming sensor values, and publishes the resulting MQTT data to the following destinations:
- A local MQTT broker deployed using a Mosquitto Docker container, attached to an InfluxDB time series database Docker container (a minimal docker-compose sketch follows this list).
- Azure IoT
- IoT Operations Dashboard for managing the access points as “assets”, the IC3000 as an “EI agent” and Azure IoT/Mosquitto as “data destinations”.
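As a reference point, the local broker and database containers could be brought up with a minimal docker-compose sketch like the one below. The image tags, ports and volume paths are assumptions for illustration, not the exact configuration used on the IC3000 in this project:

version: "3"
services:
  mosquitto:                     # local MQTT broker receiving the EI output
    image: eclipse-mosquitto
    ports:
      - "1883:1883"
    volumes:
      - ./mosquitto/config:/mosquitto/config
  influxdb:                      # time series database for local sensor history
    image: influxdb:1.8
    ports:
      - "8086:8086"
    volumes:
      - ./influxdb:/var/lib/influxdb

In practice, something still has to move the readings from the broker into InfluxDB, for example a Telegraf instance with an MQTT consumer input; how the two containers were wired together in the original setup is not detailed here.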
Video: Edge Intelligence on IoT Devices
Edge Intelligence data logic
Cisco Edge Intelligence offers a plug-in for Microsoft Visual Studio Code, which we used to develop the data logic that runs on the IC3000. The plug-in links to the assets, EI agents and data destinations as defined in the Cisco IoT Operations Dashboard, and allows for direct deployment from VS Code. Below, we include the logic developed for this project; other transformations can be deployed analogously.
Our output model defines the data format being published to our data destinations. Notably, the temperature value field was defined as a double instead of a string, in accordance with the normalization aimed for in this use case.
[{"key":"macaddress","type":"STRING","category":"TELEMETRY"}, {"key":"sensorlabel","type":"STRING","category":"TELEMETRY"}, {"key":"temperature","type":"DOUBLE","category":"TELEMETRY"}, {"key":"sensortiming","type":"STRING","category":"TELEMETRY"}, {"key":"routertiming","type":"STRING","category":"TELEMETRY"}]
Our data logic defines the data transformation and publishing deployed on our IC3000. The code snippet below transforms incoming sensor data into labeled, timestamped and normalized MQTT data destined for Azure IoT and our local MQTT broker. Through the Edge Intelligence plug-in for VS Code, this code can be uploaded to IoT Operations Dashboard, as well as deployed to an Edge Intelligence agent.
var MAC_ADDRESSES = [
    'c31d16d0208c',
    ... // other MAC addresses omitted
    'ccaffe5d4854'
];
var LABELS = [
    'sensor1',
    ... // other labels omitted
    'sensor7'
];

// for local buffering
var TEMP_READINGS = [null, null, null, null, null, null, null];
var TIMINGS = [null, null, null, null, null, null, null];

// Run on every sensor value update
function on_update() {
    // Read and parse input data per sensor
    var sensorReadings = [
        JSON.parse(ftuhustypemqtt.c31d16d0208c),
        ... // other parsing omitted
        JSON.parse(ftuhustypemqtt.ccaffe5d4854)
    ];

    var i;
    // Transform and publish temperature for each sensor
    for (i = 0; i < sensorReadings.length; i++) {
        if (sensorReadings[i] !== null) {
            if (sensorReadings[i]["type"] == "measurement") {
                TIMINGS[i] = sensorReadings[i]["time"];
                if (sensorReadings[i]["id"] == "temperature") {
                    TEMP_READINGS[i] = sensorReadings[i]["value"];
                }
            } else {
                // no new measurement: keep the locally buffered values
                TEMP_READINGS[i] = TEMP_READINGS[i];
                TIMINGS[i] = TIMINGS[i];
            }
            output.macaddress = MAC_ADDRESSES[i];
            output.sensorlabel = LABELS[i];
            output.temperature = TEMP_READINGS[i];
            output.sensortiming = TIMINGS[i];                            // sensor timestamp
            output.routertiming = (new Date(Date.now())).toISOString(); // EI timestamp
            output.publish();                                            // publish data
        }
    }
}
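Each output.publish() call above emits one record per sensor in the shape of the output model. A resulting message could look roughly like this, with the temperature now carried as a double rather than a string (the values are illustrative, not captured from the live deployment):

{
  "macaddress": "c31d16d0208c",
  "sensorlabel": "sensor1",
  "temperature": 21.4,
  "sensortiming": "2021-06-01T09:15:42Z",
  "routertiming": "2021-06-01T09:15:44.118Z"
}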
Azure IoT publishing results
Leveraging the MQTT broker and time series database capabilities in Azure IoT, the following real-time environmental analytics were collected in graphical format. These readings were parsed from the incoming MQTT data published by our lab setup.
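If the Azure-side destination is an IoT Hub, one quick way to confirm that the transformed telemetry is arriving, before building out dashboards, is the Azure CLI's IoT extension. The hub and device names below are placeholders, not the ones used in this project:

az iot hub monitor-events --hub-name <your-iot-hub> --device-id <your-ei-device>

This streams incoming device-to-cloud messages to the terminal as they arrive.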
We’d love to hear what you think. Ask a question or leave a comment below.